
    BRAHMS: Novel middleware for integrated systems computation

    Biological computational modellers are becoming increasingly interested in building large, eclectic models, including components on many different computational substrates, both biological and non-biological. At the same time, the rise of the philosophy of embodied modelling is generating a need to deploy biological models as controllers for robots in real-world environments. Finally, robotics engineers are beginning to find value in seconding biomimetic control strategies for use on practical robots. Together with the ubiquitous desire to make good on past software development effort, these trends are throwing up new challenges of intellectual and technological integration (for example across scales, across disciplines, and even across time) - challenges that are unmet by existing software frameworks. Here, we outline these challenges in detail, and go on to describe a newly developed software framework, BRAHMS, that meets them. BRAHMS is a tool for integrating computational process modules into a viable, computable system: its generality and flexibility facilitate integration across barriers, such as those described above, in a coherent and effective way. We go on to describe several cases where BRAHMS has been successfully deployed in practical situations. We also show excellent performance in comparison with a monolithic development approach. Additional benefits of developing in the framework include source code self-documentation, automatic coarse-grained parallelisation, cross-language integration, data logging, and performance monitoring; dynamic load-balancing and 'pause and continue' execution are planned. BRAHMS is built on the nascent, and similarly general-purpose, model markup language, SystemML. This will, in future, also facilitate repeatability and accountability (same answers ten years from now), transparent automatic software distribution, and interfacing with other SystemML tools. (C) 2009 Elsevier Ltd. All rights reserved.

    Technical Integration of Hippocampus, Basal Ganglia and Physical Models for Spatial Navigation

    Computational neuroscience is increasingly moving beyond modeling individual neurons or neural systems to consider the integration of multiple models, often constructed by different research groups. We report on our preliminary technical integration of recent hippocampal formation, basal ganglia and physical environment models, together with visualisation tools, as a case study in the use of Python across the modelling tool-chain. We do not present new modeling results here. The architecture incorporates leaky-integrator and rate-coded neurons, a 3D environment with collision detection and tactile sensors, 3D graphics and 2D plots. We found Python to be a flexible platform, offering a significant reduction in development time, without a corresponding significant increase in execution time. We illustrate this by implementing a part of the model in various alternative languages and coding styles, and comparing their execution times. For very large-scale system integration, communication with other languages and parallel execution may be required, which we demonstrate using the BRAHMS framework's Python bindings.
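The leaky-integrator, rate-coded units mentioned above can be illustrated in a few lines of Python. This is a generic sketch — the time constant, gain, and function names are our own — not the implementation used in the reported model.

```python
import numpy as np

def leaky_integrator_step(a, u, dt=0.001, tau=0.02):
    """One Euler step of a leaky integrator: tau * da/dt = -a + u."""
    return a + (dt / tau) * (u - a)

def rate(a, gain=1.0, threshold=0.0):
    """Piecewise-linear output nonlinearity, clipped to [0, 1]."""
    return np.clip(gain * (a - threshold), 0.0, 1.0)

# Drive a single unit with constant input; activation relaxes toward u.
a = 0.0
for _ in range(1000):
    a = leaky_integrator_step(a, u=0.8)
print(round(a, 3))  # → 0.8
```

With dt/tau = 0.05, the unit converges geometrically to its input, which is the behaviour a rate-coded population unit needs regardless of which language hosts it.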

    Biomimetic tactile target acquisition, tracking and capture

    Good performance in unstructured/uncertain environments is an ongoing problem in robotics; in biology, it is an everyday observation. Here, we model a particular biological system - hunting in the Etruscan shrew - as a case study in biomimetic robot design. These shrews strike rapidly and accurately after gathering very limited sensory information from their whiskers; we attempt to mimic this performance by using model-based simultaneous discrimination and localisation of a 'prey' robot (i.e. by using strong priors to make sense of limited sensory data), building on our existing low-level models of attention and appetitive behaviour in small mammals. We report performance that is comparable, given the spatial and temporal scale differences, to shrew performance, and discuss what this study reveals about biomimetic robot design in general. © 2013 Elsevier B.V. All rights reserved.

    Fast, flexible closed-loop feedback: Tracking movement in “real-millisecond-time”

    © 2019 Sehara et al. One of the principal functions of the brain is to control movement and rapidly adapt behavior to a changing external environment. Over the last decades, our ability to monitor activity in the brain, and to manipulate it while also manipulating the environment the animal moves through, has grown increasingly sophisticated. However, our ability to track the movement of the animal in real time has not kept pace. Here, we use a dynamic vision sensor (DVS) based event-driven neuromorphic camera system to implement real-time, low-latency tracking of a single whisker that mice can move at 25 Hz. The customized DVS system described here converts whisker motion into a series of events that can be used to estimate the position of the whisker and to trigger a position-based output interactively within 2 ms. This neuromorphic chip-based closed-loop system provides feedback rapidly and flexibly. With this system, it becomes possible to use the movement of whiskers, or in principle the movement of any part of the body, to reward or punish in a rapidly reconfigurable way. These methods can be used to manipulate behavior, and the neural circuits that help animals adapt to changing values of a sequence of motor actions.
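A minimal sketch of the event-driven tracking idea, assuming a simplified event format (timestamp plus pixel coordinates) and using a running median of recent event x-coordinates as the position estimate. The class and parameter names are hypothetical; the actual DVS pipeline and its 2 ms trigger path are not reproduced here.

```python
from collections import deque
from statistics import median

class WhiskerTracker:
    """Sketch of event-driven position tracking: each DVS event carries a
    timestamp and pixel coordinates; position is the running median of
    recent event x-coordinates (a simple, latency-friendly estimator)."""

    def __init__(self, window=64, trigger_at=100.0):
        self.events = deque(maxlen=window)   # ring buffer of recent x values
        self.trigger_at = trigger_at

    def on_event(self, t_us, x, y):
        self.events.append(x)
        pos = median(self.events)
        # Fire the feedback output as soon as the estimate crosses threshold.
        return pos, pos >= self.trigger_at

tracker = WhiskerTracker(window=4, trigger_at=100.0)
for x in [90, 98, 104, 110]:
    pos, fired = tracker.on_event(t_us=0, x=x, y=50)
print(pos, fired)  # → 101.0 True
```

Because each incoming event costs only one buffer append and a small median, the per-event latency is bounded and predictable — the property a closed-loop trigger depends on.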

    Modeling the Emergence of Whisker Direction Maps in Rat Barrel Cortex

    Based on measuring responses to rat whiskers as they are mechanically stimulated, one recent study suggests that barrel-related areas in layer 2/3 rat primary somatosensory cortex (S1) contain a pinwheel map of whisker motion directions. Because this map is reminiscent of topographic organization for visual direction in primary visual cortex (V1) of higher mammals, we asked whether the S1 pinwheels could be explained by an input-driven developmental process as is often suggested for V1. We developed a computational model to capture how whisker stimuli are conveyed to supragranular S1, and simulate lateral cortical interactions using an established self-organizing algorithm. Inputs to the model each represent the deflection of a subset of 25 whiskers as they are contacted by a moving stimulus object. The subset of deflected whiskers corresponds with the shape of the stimulus, and the deflection direction corresponds with the movement direction of the stimulus. If these two features of the inputs are correlated during the training of the model, a somatotopically aligned map of direction emerges for each whisker in S1. Predictions of the model that are immediately testable include (1) that somatotopic pinwheel maps of whisker direction exist in adult layer 2/3 barrel cortex for every large whisker on the rat's face, even peripheral whiskers; and (2) in the adult, neurons with similar directional tuning are interconnected by a network of horizontal connections, spanning distances of many whisker representations. We also propose specific experiments for testing the predictions of the model by manipulating patterns of whisker inputs experienced during early development. The results suggest that similar intracortical mechanisms guide the development of primate V1 and rat S1.
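The input-driven self-organization described can be illustrated with a generic Kohonen-style update. The study uses an established self-organizing algorithm whose details differ, so treat the function below — names, map size, and parameters are our own — as a stand-in for the general mechanism, not the paper's model.

```python
import numpy as np

def som_step(weights, x, lr=0.1, sigma=1.0):
    """One Kohonen-style update: find the best-matching unit, then pull
    nearby units' weights toward the input with a Gaussian neighbourhood.
    weights: (rows, cols, dim) map; x: (dim,) input vector."""
    rows, cols, _ = weights.shape
    dists = np.linalg.norm(weights - x, axis=2)
    bi, bj = np.unravel_index(np.argmin(dists), dists.shape)
    ii, jj = np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij")
    h = np.exp(-((ii - bi) ** 2 + (jj - bj) ** 2) / (2 * sigma ** 2))
    weights += lr * h[..., None] * (x - weights)
    return weights, (bi, bj)

# Train a small map on random 2D "direction" inputs.
rng = np.random.default_rng(0)
w = rng.random((5, 5, 2))
for _ in range(200):
    w, _ = som_step(w, rng.random(2))
```

The key point for the abstract's argument is that topography (here, the Gaussian neighbourhood) plus correlated input statistics are enough for an orderly feature map to emerge; no map is specified in advance.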

    Strategy Change in Vibrissal Active Sensing during Rat Locomotion

    During exploration, rats and other small mammals make rhythmic back-and-forth sweeps of their long facial whiskers (macrovibrissae) [1, 2 and 3]. These “whisking” movements are modulated by head movement [4] and by vibrissal sensory input [5 and 6] and hence are often considered “active” in the Gibsonian sense of being purposive and information seeking [7 and 8]. An important hallmark of active sensing is the modification of the control strategy according to context [9]. Using a task in which rats were trained to run circuits for food, we tested the hypothesis that whisker control, as measured by high-speed videography, changes with contextual variables such as environment familiarity, risk of collision, and availability of visual cues. In novel environments, functionally blind rats moved at slow speeds and performed broad whisker sweeps. With greater familiarity, however, they moved more rapidly, protracted their whiskers further, and showed decreased whisking amplitude. These findings indicate a strategy change from using the vibrissae to explore nearby surfaces to using them primarily for “look ahead.” In environments with increased risk of collision, functionally blind animals moved more slowly but protracted their whiskers further. Sighted animals also showed changes in whisker control strategy with increased familiarity, but these changes were different from those of the functionally blind strain. Sighted animals also changed their vibrissal behavior when visual cues were subsequently removed (by being placed in darkness). These contextual influences provide strong evidence of active control and demonstrate that the vibrissal system provides an accessible model of purposive behavior in mammals.

    Simultaneous localisation and mapping on a multi-degree of freedom biomimetic whiskered robot

    A biomimetic mobile robot called “Shrewbot” has been built as part of a neuroethological study of the mammalian facial whisker sensory system. This platform has been used to further evaluate the problem space of whisker-based tactile Simultaneous Localisation And Mapping (tSLAM). Shrewbot uses a biomorphic 3-dimensional array of active whiskers and a model of action selection based on tactile sensory attention to explore a circular walled arena sparsely populated with simple geometric shapes. Datasets taken during this exploration have been used to parameterise an approach to localisation and mapping based on probabilistic occupancy grids. We present the results of this work and conclude that simultaneous localisation and mapping is possible given only noisy odometry and tactile information from a 3-dimensional array of active biomimetic whiskers and no prior information of features in the environment.
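The probabilistic occupancy-grid idea can be sketched with a standard log-odds update, in which each whisker contact raises a cell's occupancy belief and each contact-free sweep lowers it. The class below is a hypothetical illustration with made-up sensor-model probabilities, not Shrewbot's implementation.

```python
import numpy as np

def logodds(p):
    return np.log(p / (1.0 - p))

class OccupancyGrid:
    """Minimal log-odds occupancy grid: whisker contact with a cell adds
    l_hit to its log-odds; sweeping through a cell without contact adds
    l_miss (negative). Noisy odometry would perturb the cell index upstream."""

    def __init__(self, shape=(20, 20), p_hit=0.7, p_miss=0.4):
        self.L = np.zeros(shape)  # log-odds 0 == p(occupied) = 0.5 (unknown)
        self.l_hit, self.l_miss = logodds(p_hit), logodds(p_miss)

    def update(self, cell, contact):
        self.L[cell] += self.l_hit if contact else self.l_miss

    def p_occupied(self, cell):
        return 1.0 / (1.0 + np.exp(-self.L[cell]))

grid = OccupancyGrid()
for _ in range(5):
    grid.update((3, 4), contact=True)   # repeated whisker contacts
print(round(grid.p_occupied((3, 4)), 3))  # → 0.986
```

Working in log-odds makes each update a single addition, so belief in a cell accumulates across repeated, individually unreliable whisker contacts — the property that lets tSLAM cope with sparse, noisy tactile data.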

    Multimodal Representation Learning for Place Recognition Using Deep Hebbian Predictive Coding

    Recognising familiar places is a competence required in many engineering applications that interact with the real world, such as robot navigation. Combining information from different sensory sources promotes the robustness and accuracy of place recognition. However, mismatches in data registration, dimensionality, and timing between modalities remain challenging problems in multisensory place recognition. Spurious data generated by sensor drop-out in multisensory environments is particularly problematic and often resolved through ad hoc and brittle solutions. An effective approach to these problems is demonstrated by animals as they gracefully move through the world. Therefore, we take a neuro-ethological approach by adopting self-supervised representation learning based on a neuroscientific model of visual cortex known as predictive coding. We demonstrate how this parsimonious network algorithm, which is trained using a local learning rule, can be extended to combine visual and tactile sensory cues from a biomimetic robot as it naturally explores a visually aliased environment. The place recognition performance obtained using joint latent representations generated by the network is significantly better than that of contemporary representation learning techniques. Further, we see evidence of improved robustness of place recognition in the face of unimodal sensor drop-out. The proposed multimodal deep predictive coding algorithm is also linearly extensible to accommodate more than two sensory modalities, thereby providing an intriguing example of the value of neuro-biologically plausible representation learning for multimodal navigation.
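A single-layer predictive coding loop with local updates, of the general kind the abstract describes, can be sketched as follows. The function names, learning rates, and dimensions are our own assumptions; the paper's deep, multimodal architecture is not reproduced here.

```python
import numpy as np

def pc_infer(W, x, steps=50, lr=0.1):
    """Infer a latent r that minimises the prediction error e = x - W r
    by gradient descent; the update lr * W.T @ e uses only locally
    available error signals, in the predictive coding spirit."""
    r = np.zeros(W.shape[1])
    for _ in range(steps):
        e = x - W @ r
        r += lr * W.T @ e
    return r, x - W @ r

def pc_learn(W, x, eta=0.01):
    """Hebbian-style weight update: outer product of error and latent."""
    r, e = pc_infer(W, x)
    W += eta * np.outer(e, r)
    return W

# Repeatedly presenting one input drives its prediction error down.
rng = np.random.default_rng(1)
W = 0.1 * rng.standard_normal((8, 3))
x = rng.standard_normal(8)
for _ in range(200):
    W = pc_learn(W, x)
_, e = pc_infer(W, x)
print(round(float(np.linalg.norm(e)), 4))
```

In a multimodal version, `x` would concatenate visual and tactile features, with the shared latent `r` serving as the joint representation used for place matching.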

    Whisker Movements Reveal Spatial Attention: A Unified Computational Model of Active Sensing Control in the Rat

    Spatial attention is most often investigated in the visual modality through measurement of eye movements, with primates, including humans, a widely studied model. Its study in laboratory rodents, such as mice and rats, requires different techniques, owing to the lack of a visual fovea and the particular ethological relevance of orienting movements of the snout and the whiskers in these animals. In recent years, several reliable relationships have been observed between environmental and behavioural variables and movements of the whiskers, but the function of these responses, as well as how they integrate, remains unclear. Here, we propose a unifying abstract model of whisker movement control that has as its key variable the region of space that is the animal's current focus of attention, and demonstrate, using computer-simulated behavioral experiments, that the model is consistent with a broad range of experimental observations. A core hypothesis is that the rat explicitly decodes the location in space of whisker contacts and that this representation is used to regulate whisker drive signals. This proposition stands in contrast to earlier proposals that the modulation of whisker movement during exploration is mediated primarily by reflex loops. We go on to argue that the superior colliculus is a candidate neural substrate for the siting of a head-centred map guiding whisker movement, in analogy to current models of visual attention. The proposed model has the potential to offer a more complete understanding of whisker control as well as to highlight the potential of the rodent and its whiskers as a tool for the study of mammalian attention.

    Neural Computation via Neural Geometry: A Place Code for Inter-whisker Timing in the Barrel Cortex?

    The place theory proposed by Jeffress (1948) is still the dominant model of how the brain represents the movement of sensory stimuli between sensory receptors. According to the place theory, delays in signalling between neurons, dependent on the distances between them, compensate for time differences in the stimulation of sensory receptors. Hence the location of neurons, activated by the coincident arrival of multiple signals, reports the stimulus movement velocity. Despite its generality, most evidence for the place theory has been provided by studies of the auditory systems of auditory specialists like the barn owl, while for mammalian auditory systems the evidence is inconclusive. We ask to what extent the somatosensory systems of tactile specialists like rats and mice use distance-dependent delays between neurons to compute the motion of tactile stimuli between the facial whiskers (or ‘vibrissae’). We present a model in which synaptic inputs evoked by whisker deflections arrive at neurons in layer 2/3 (L2/3) somatosensory ‘barrel’ cortex at different times. The timing of synaptic inputs to each neuron depends on its location relative to sources of input in layer 4 (L4) that represent stimulation of each whisker. Constrained by the geometry and timing of projections from L4 to L2/3, the model can account for a range of experimentally measured responses to two-whisker stimuli. Consistent with that data, responses of model neurons located between the barrels to paired stimulation of two whiskers are greater than the sum of the responses to either whisker input alone. The model predicts that for neurons located closer to either barrel these supralinear responses are tuned for longer inter-whisker stimulation intervals, yielding a topographic map for the inter-whisker deflection interval across the surface of L2/3. This map constitutes a neural place code for the relative timing of sensory stimuli.
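The distance-dependent-delay computation at the heart of the place theory can be sketched directly: with two input sources a fixed distance apart and a finite conduction speed, the site of coincident arrival moves as the inter-whisker interval changes. The span, conduction speed, and grid below are illustrative values, not parameters of the model in the paper.

```python
def best_site(interval_ms, span_mm=1.0, v_mm_per_ms=0.3, n=101):
    """Jeffress-style place code sketch: whisker A's barrel sits at x=0,
    whisker B's at x=span. A site at x receives A's signal after x/v and
    B's after (span - x)/v + interval. The site where the two arrival
    times coincide encodes the inter-whisker deflection interval."""
    best_x, best_mismatch = None, float("inf")
    for i in range(n):
        x = span_mm * i / (n - 1)
        t_a = x / v_mm_per_ms
        t_b = (span_mm - x) / v_mm_per_ms + interval_ms
        if abs(t_a - t_b) < best_mismatch:
            best_mismatch, best_x = abs(t_a - t_b), x
    return best_x

# Simultaneous deflection -> coincidence midway between the barrels.
print(best_site(0.0))  # → 0.5
# Whisker B deflected 1 ms later -> coincidence shifts toward B's barrel.
print(best_site(1.0))  # → 0.65
```

Reading off the coincidence site thus converts a timing difference into a position — the map of inter-whisker interval across L2/3 that the abstract calls a neural place code.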